Pointwise mutual information

Pointwise mutual information (PMI), or point mutual information, is a measure of association used in information theory and statistics. In contrast to mutual information (MI), which builds upon PMI, PMI refers to single events, whereas MI refers to the average over all possible events.
==Definition==
The PMI of a pair of outcomes ''x'' and ''y'' belonging to discrete random variables ''X'' and ''Y'' quantifies the discrepancy between the probability of their coincidence under their joint distribution and the probability of their coincidence under their individual distributions, assuming independence. Mathematically:
:
\operatorname{pmi}(x;y) \equiv \log\frac{p(x,y)}{p(x)\,p(y)} = \log\frac{p(x\mid y)}{p(x)} = \log\frac{p(y\mid x)}{p(y)}.
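As a concrete sketch of this definition in code, assuming the probabilities have already been estimated (the function name and signature here are illustrative, not from any particular library):

<syntaxhighlight lang="python">
import math

def pmi(p_xy: float, p_x: float, p_y: float, base: float = 2.0) -> float:
    """Pointwise mutual information of a single outcome pair (x, y).

    p_xy is the joint probability p(x, y); p_x and p_y are the
    marginal probabilities p(x) and p(y). Returns -inf if the pair
    never co-occurs (p_xy == 0), matching the lower bound below.
    """
    if p_xy == 0.0:
        return float("-inf")
    return math.log(p_xy / (p_x * p_y), base)
</syntaxhighlight>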

The mutual information (MI) of the random variables ''X'' and ''Y'' is the expected value of the PMI over all possible outcomes (with respect to the joint distribution p(x,y)).
The measure is symmetric (\operatorname{pmi}(x;y)=\operatorname{pmi}(y;x)). It can take positive or negative values, but is zero if ''X'' and ''Y'' are independent. Note that even though PMI may be negative or positive, its expected value over all joint events (MI) is non-negative. PMI is maximal when ''X'' and ''Y'' are perfectly associated (i.e. p(x|y)=1 or p(y|x)=1), yielding the following bounds:
:
-\infty \leq \operatorname{pmi}(x;y) \leq \min\left(-\log p(x), -\log p(y)\right).

Finally, \operatorname{pmi}(x;y) will increase if p(x|y) is fixed but p(x) decreases; this follows from writing the definition as \operatorname{pmi}(x;y) = \log p(x\mid y) - \log p(x). A quick numeric check of this is sketched below.
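Here is that check, holding a hypothetical conditional probability p(x|y) fixed while shrinking p(x) (the chosen values are illustrative):

<syntaxhighlight lang="python">
import math

p_x_given_y = 0.8  # hypothetical conditional probability, held fixed
for p_x in (0.8, 0.4, 0.1):
    # pmi(x;y) = log2(p(x|y) / p(x)) grows as p(x) shrinks
    print(f"p(x) = {p_x}: pmi = {math.log2(p_x_given_y / p_x):.1f}")
# p(x) = 0.8: pmi = 0.0
# p(x) = 0.4: pmi = 1.0
# p(x) = 0.1: pmi = 3.0
</syntaxhighlight>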
Here is an example to illustrate. Consider the following joint distribution p(x,y) over two binary random variables:

         y = 0    y = 1
 x = 0   0.1      0.7
 x = 1   0.15     0.05

Using this table we can marginalize to get the following additional table for the individual distributions:

 p(x=0) = 0.8     p(x=1) = 0.2
 p(y=0) = 0.25    p(y=1) = 0.75

With this example, we can compute four values for \operatorname{pmi}(x;y). Using base-2 logarithms:

:
\operatorname{pmi}(x{=}0;y{=}0) = \log_2\frac{0.1}{0.8 \times 0.25} = -1
:
\operatorname{pmi}(x{=}0;y{=}1) = \log_2\frac{0.7}{0.8 \times 0.75} \approx 0.222392
:
\operatorname{pmi}(x{=}1;y{=}0) = \log_2\frac{0.15}{0.2 \times 0.25} \approx 1.584963
:
\operatorname{pmi}(x{=}1;y{=}1) = \log_2\frac{0.05}{0.2 \times 0.75} \approx -1.584963

(For reference, the mutual information \operatorname{I}(X;Y) would then be 0.214170945.)
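These numbers can be reproduced with a short script; this is a sketch using the joint table above, with `p_joint` and the other variable names chosen for illustration:

<syntaxhighlight lang="python">
import math

# Joint distribution p(x, y) from the table above.
p_joint = {(0, 0): 0.1, (0, 1): 0.7, (1, 0): 0.15, (1, 1): 0.05}

# Marginalize to obtain p(x) and p(y).
p_x = {x: sum(p for (xi, _), p in p_joint.items() if xi == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yi), p in p_joint.items() if yi == y) for y in (0, 1)}

mi = 0.0
for (x, y), p in p_joint.items():
    val = math.log2(p / (p_x[x] * p_y[y]))  # base-2 PMI of this outcome pair
    mi += p * val                           # MI is the p(x,y)-weighted average
    print(f"pmi(x={x}; y={y}) = {val:+.6f}")

print(f"I(X;Y) = {mi:.9f}")  # -> 0.214170945
</syntaxhighlight>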
